GeForce GTX 295 review | preview



Final Words & Conclusion

The Verdict

Alright you guys and girls, I hope you enjoyed this little preview of the GeForce GTX 295. The fact is, it's really fast, and that alone gives it every chance of being a success. The launch will be on January 8th, when CES kicks off in Las Vegas.

Initially a very rough calculation led me to believe that this product would hover around 550-600 USD, targeted directly against the Radeon HD 4870 X2. It's better than that though, as NVIDIA will set the launch price very aggressively at roughly 499 USD. AMD, of course, will lower the price of their product in response, and that my friends is the inevitable evolution of graphics technology versus competition.

No matter what game you play on the GeForce GTX 295, you'll play it at dazzling framerates, very high resolutions and the very best in image quality. We have shown you the performance in some pretty hot titles. Sure, Left 4 Dead, based on the Half-Life 2 Source engine, is a fairly easy task for any modern graphics card, but the card scaled extremely well in it. And when we look at Far Cry 2, we see more of that jaw-dropping performance. The same goes for Call of Duty: World at War and obviously the other titles we tested. You will not have to compromise on image quality settings, and you can play at the highest resolutions. That is of course expected. I also have to mention that the GTX 295 is a graphics card for users with a high-resolution monitor. The overall performance really starts to kick in from roughly 1920x1200 onwards, resolutions where far more pixels need to be rendered and where GPU limitation normally kicks in pretty fast... So keep in mind that cards like these really start to show off at the higher resolutions.

Compared with the Radeon HD 4870 X2: what ATI released was, and still is, an incredible product. NVIDIA will finally be able to introduce a proper answer to that card. Admittedly, ATI was able to squeeze a good amount of extra performance out of the X2 with their December Catalyst driver, and as a result both cards are pretty close to each other in certain games. Fallout 3 was such an example, where the Catalyst 8.12 driver made a good step forward in performance and CrossFire compatibility.

The winner today of course is the GeForce GTX 295, and considering we were using an early engineering sample with a very beta driver, things are looking really good for NVIDIA.

Also, despite the fact that we were limited to publishing a handful of games, internally we of course already ran the majority of our benchmark suite with other titles. The performance spread is consistent, and the card worked with every game we threw at it.

Then there's the additional value. As you guys know, you also get the benefits of PhysX and third-party software that can utilize the GPU through CUDA. Think of software like Badaboom, future Photoshop GPU support and other upcoming third-party applications.

PhysX remains a bit of a difficult topic. Mirror's Edge will probably be the decisive factor in whether PhysX makes it or fails really hard. Here in our forums I have been following a lot of conflicting opinions from you guys about PhysX. Some hate it, some love it, some don't care. My preference is simple though: I like it for the fact alone that it adds additional effects and thus a better gaming experience. Whether or not PhysX, aka physics calculations, per se should be run on a GPU is another discussion. The GPU is without doubt better suited to the task due to its sheer parallel speed. A bigger problem for PhysX is that it is a closed standard for GeForce graphics cards, which leaves out ATI and as such a big chunk of the market, and that's not something game developers like at all. We just learned that, next to the recently announced EA deal and others, more software houses are going to sign up for PhysX in the very near future.
To get broad PhysX support, the game industry simply wants to see an open standard for all graphics cards before they decide to invest programmers' resources, and thus cold hard cash, into it. That will very likely happen with the arrival of DirectX 11, which introduces a compute shader that can carry physics calculations through the API straight to the GPU. For those curious why the GPU is such a natural fit for that kind of work, there's a small sketch below; after that, enough of PhysX.
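Purely as an illustration, and definitely not code from NVIDIA's PhysX SDK, here's a minimal CUDA sketch of the kind of data-parallel work GPU physics boils down to: one thread per particle, each doing its own gravity and position update. The kernel name, particle count and time step are made up for the example.

```cuda
// Hypothetical sketch of data-parallel effects physics in CUDA.
// Not PhysX code; names and numbers are illustrative only.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void updateParticles(float3* pos, float3* vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // Each particle gets its own thread: simple Euler integration under gravity.
    vel[i].y += -9.81f * dt;
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main()
{
    const int n = 1 << 20;                 // one million particles (illustrative)
    float3 *d_pos, *d_vel;
    cudaMalloc((void**)&d_pos, n * sizeof(float3));
    cudaMalloc((void**)&d_vel, n * sizeof(float3));
    cudaMemset(d_pos, 0, n * sizeof(float3));
    cudaMemset(d_vel, 0, n * sizeof(float3));

    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    updateParticles<<<blocks, threads>>>(d_pos, d_vel, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();

    printf("Stepped %d particles on the GPU\n", n);
    cudaFree(d_pos);
    cudaFree(d_vel);
    return 0;
}
```

With hundreds of shader processors per GPU, and two GPUs on this card, work like that gets spread over a massive number of threads at once, which is exactly why effects physics maps so nicely onto graphics hardware.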

We haven't really talked about it, but there's another thing that might be interesting for some of you. Granted, you'll have to donate a kidney to build a PC capable of it, but quad-SLI with these 55nm GTX 200 GPUs is once again a possibility. Come on, that would be interesting, as we've just increased the shader processor count to 960 :) Do think about a proper PSU for that situation before you even consider it though, okay?

(Guru3D Top Pick award)

So there you have it. There's never a dull moment in the land of graphics technology. Personally I live and breathe for this high-end gear... It's pretty amazing where we are with graphics anno 2008, as yours truly's first review of a graphics card was roughly 10 years ago... a little 3D accelerator called the 3dfx Voodoo Graphics. Look it up on the web and compare then and now... the era of supercomputing on GPUs has arrived. And it's only getting better.

Early January 2009 we'll have the final review of this product ready for you guys and we'll probably have some other stuff to show you as well. I'd like to close this review with just one word: impressive.

